'AI-assisted genocide': Israel reportedly used database for Gaza kill lists

Al Jazeera

The Israeli military's reported use of an untested and undisclosed artificial intelligence-powered database to identify targets for its bombing campaign in Gaza has alarmed human rights and technology experts, who said it could amount to "war crimes". The Israeli-Palestinian publication +972 Magazine and the Hebrew-language media outlet Local Call recently reported that the Israeli army was isolating and identifying thousands of Palestinians as potential bombing targets using an AI-assisted targeting system called Lavender. "That database is responsible for drawing up kill lists of as many as 37,000 targets," Al Jazeera's Rory Challands, reporting from occupied East Jerusalem, said on Thursday. The unnamed Israeli intelligence officials who spoke to the media outlets said Lavender had an error rate of about 10 percent. "But that didn't stop the Israelis from using it to fast-track the identification of often low-level Hamas operatives in Gaza and bombing them," Challands said.


Fact or fiction? Israeli maps and AI do not save Palestinian lives

Al Jazeera

On December 2, the Israeli army's Arabic-language spokesperson Avichay Adraee posted a map of Gaza, broken up into a grid of numbered blocks, with instructions that Palestinians living in certain areas evacuate to Rafah. Leaflets containing a QR code linking to the map on the Israeli army's website were also dropped over Gaza. This move came as Israeli fighter jets bombarded the south of the Strip – previously designated a "safe zone" – killing hundreds of Palestinians in 24 hours. The Israeli army announced that it had hit "400 targets". Meanwhile, media reports revealed that the Israeli army's ability to intensify what it calls "precision" air strikes has been boosted by an artificial intelligence (AI) tool that generates "targets".